
    Experimental Realization of 1 → 2 Asymmetric Phase-Covariant Quantum Cloning

    While exact cloning of an unknown quantum state is prohibited by the linearity of quantum mechanics, approximate cloning is possible and has been used, e.g., to derive limits on the security of quantum communication protocols. In the case of asymmetric cloning, the information from the input state is distributed asymmetrically between the different output states. Here, we consider asymmetric phase-covariant cloning, where the goal is to optimally transfer the phase information from a single input qubit to different output qubits. We construct an optimal quantum cloning machine for two qubits that does not require ancilla qubits and implement it on an NMR quantum information processor. Comment: 6 pages, 5 figures
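
    For intuition, here is a minimal NumPy sketch of the standard economical (ancilla-free) 1 → 2 phase-covariant cloner, U|0>|0> = |00>, U|1>|0> = cos t |10> + sin t |01>; the asymmetry angle t and all code names are ours, not taken from the paper's NMR implementation. The computed fidelities reproduce the closed forms F_A = (1 + cos t)/2 and F_B = (1 + sin t)/2, which meet at the optimal symmetric value 1/2 + 1/√8 ≈ 0.854 for t = π/4.

```python
# A minimal sketch (not the paper's NMR pulse sequence): the economical,
# ancilla-free 1 -> 2 phase-covariant cloner
#     U|0>|0> = |00>,   U|1>|0> = cos(t)|10> + sin(t)|01>
# applied to an equatorial input (|0> + e^{i*phi}|1>)/sqrt(2). The angle t
# sets the asymmetry between the two clones.
import numpy as np

def clone_fidelities(t, phi=0.3):
    a = np.exp(1j * phi) / np.sqrt(2)            # |1> amplitude of the input
    # output amplitudes in the basis |00>, |01>, |10>, |11> (qubits A, B)
    psi = np.array([1 / np.sqrt(2), a * np.sin(t), a * np.cos(t), 0])
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)   # axes (A, B, A', B')
    rho_A = np.trace(rho, axis1=1, axis2=3)      # trace out clone B
    rho_B = np.trace(rho, axis1=0, axis2=2)      # trace out clone A
    inp = np.array([1 / np.sqrt(2), a])
    return (np.real(inp.conj() @ rho_A @ inp),
            np.real(inp.conj() @ rho_B @ inp))

for t in (np.pi / 8, np.pi / 4, 3 * np.pi / 8):
    FA, FB = clone_fidelities(t)
    # reproduces F_A = (1 + cos t)/2, F_B = (1 + sin t)/2; both equal
    # 1/2 + 1/sqrt(8) ~ 0.854 at the symmetric point t = pi/4
    print(f"t = {t:.3f}: F_A = {FA:.4f}, F_B = {FB:.4f}")
```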

    Uniqueness of Nash equilibria in quantum Cournot duopoly game

    We examine a quantum Cournot game whose classical counterpart has multiple Nash equilibria. Although the classical equilibria fail to be Pareto optimal, the quantum equilibrium exhibits the following two properties: (i) if the measure of entanglement between the strategic variables chosen by the competing firms is sufficiently large, the multiplicity of equilibria vanishes, and (ii) the more strongly the strategic variables are entangled, the more closely the unique equilibrium approaches the optimal one. Comment: 7 pages, 2 figures
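
    As a rough illustration of the entanglement mechanism (not the paper's game, which has multiple classical equilibria), the sketch below finds the Nash quantities by best-response iteration in a Li-Du-Massar-style quantum Cournot duopoly with linear demand; all market parameters are invented. As the entanglement parameter g grows, the equilibrium quantity moves from the classical Cournot value (a − c)/3 toward the Pareto-optimal (a − c)/4.

```python
# Illustrative sketch (not the paper's exact model): quantum Cournot duopoly
# in the Li-Du-Massar minimal quantization scheme, where the moves x1, x2
# map to quantities q1 = x1*cosh(g) + x2*sinh(g), q2 = x2*cosh(g) + x1*sinh(g)
# and g >= 0 measures entanglement. Linear demand P = a - q1 - q2, unit cost c.
import numpy as np
from scipy.optimize import minimize_scalar

a, c = 100.0, 10.0   # hypothetical demand intercept and unit cost

def profit(i, x1, x2, g):
    q1 = x1 * np.cosh(g) + x2 * np.sinh(g)
    q2 = x2 * np.cosh(g) + x1 * np.sinh(g)
    price = a - q1 - q2
    return (price - c) * (q1 if i == 1 else q2)

def nash_quantity(g, iters=200):
    x1 = x2 = (a - c) / 3.0
    for _ in range(iters):   # alternating best-response dynamics
        x1 = minimize_scalar(lambda x: -profit(1, x, x2, g),
                             bounds=(0, a), method='bounded').x
        x2 = minimize_scalar(lambda x: -profit(2, x1, x, g),
                             bounds=(0, a), method='bounded').x
    return x1 * np.cosh(g) + x2 * np.sinh(g)   # realized q1 (= q2 by symmetry)

print("classical Cournot q_i :", (a - c) / 3)   # 30.0, not Pareto optimal
print("Pareto-optimal q_i    :", (a - c) / 4)   # 22.5, splits monopoly output
for g in (0.0, 0.5, 1.0, 2.0):
    print(f"g = {g}: equilibrium q_i = {nash_quantity(g):.3f}")
```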

    Effective Mass of the Four Flux Composite Fermion at ν = 1/4

    We have measured the effective mass (m*) of the four-flux composite fermion (^4CF) at Landau level filling factor ν = 1/4, using the activation energy gaps at the fractional quantum Hall effect (FQHE) states ν = 2/7, 3/11, and 4/15 and the temperature dependence of the Shubnikov-de Haas (SdH) oscillations around ν = 1/4. We find that the energy gaps show a linear dependence on the effective magnetic field B_eff (≡ B − B_{ν=1/4}), and from this linear dependence we obtain m* = 1.0 m_e and a disorder broadening Γ ~ 1 K for a sample of density n = 0.87 × 10^11 cm^-2. The m* deduced from the temperature dependence of the SdH effect shows large differences between ν > 1/4 and ν < 1/4. For ν > 1/4, m* ~ 1.0 m_e; together with the mass derived from the data around ν = 1/2, it follows a √B_ν scaling, and it increases as ν → 1/4, resembling the findings around ν = 1/2. For ν < 1/4, m* increases rapidly with increasing B_eff and can be described by m*/m_e = −3.3 + 5.7 B_eff. This anomalous dependence on B_eff is a precursor to the formation of the insulating phase at still lower filling. Comment: 5 pages, 3 figures
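
    To make the mass extraction concrete, here is a back-of-envelope sketch that converts a linear fit Δ = (ħe/m*)B_eff − Γ of activation gaps versus effective field into m*/m_e and Γ. The three data points are invented placeholders, chosen only so the fit returns values of the same order as the paper's (m* ≈ 1.0 m_e, Γ ≈ 1 K); they are not the measured gaps.

```python
# Back-of-envelope sketch: extracting m* and the disorder broadening Gamma
# from a linear fit Delta = (hbar*e/m*) * B_eff - Gamma of activation gaps
# (expressed in kelvin) versus effective field. The (B_eff, Delta) points
# below are hypothetical illustrations, NOT the paper's data.
import numpy as np

hbar, e_ch, m_e, k_B = 1.0546e-34, 1.6022e-19, 9.1094e-31, 1.3807e-23

B_eff = np.array([0.9, 1.8, 2.7])       # tesla, hypothetical
Delta = np.array([0.21, 1.42, 2.63])    # kelvin, hypothetical

slope, intercept = np.polyfit(B_eff, Delta, 1)   # Delta = slope*B_eff + intercept
m_star = hbar * e_ch / (slope * k_B)             # slope = hbar*e/(m* k_B) in K/T
print(f"m*/m_e = {m_star / m_e:.2f}")            # ~1.0 for a slope of ~1.34 K/T
print(f"Gamma  = {-intercept:.2f} K")            # disorder broadening
```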

    Two-Photon Beatings Using Biphotons Generated from a Two-Level System

    We propose a two-photon beating experiment based upon biphotons generated from a resonantly pumped two-level system operating in a backward geometry. On the one hand, the linear optical response causes biphotons produced from the two sidebands of the Mollow triplet to propagate with tunable refractive indices, while the central component propagates with unity refractive index. The relative phase difference due to the different refractive indices is analogous to the path-length difference between the long-long and short-short pathways in the original Franson interferometer. By subtracting the linear Rayleigh scattering of the pump, the visibility in the central part of the two-photon beating interference can ideally be tuned anywhere in [0, 100%] by varying the pump power, the material length, and the atomic density, which indicates a Bell-type inequality violation. On the other hand, the proposed experiment may be an interesting way of probing the quantum nature of the detection process: the interference will disappear when the separation of the Mollow peaks approaches the fundamental timescale for photon absorption in the detector. Comment: to appear in Phys. Rev. A (2008)
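
    A toy numeric model of the phase mechanism (ours, not the paper's full calculation): one biphoton component sees a tunable index n while the other propagates with n = 1, so the relative phase Δφ = (n − 1)ωL/c mimics the long-long/short-short path difference, and the coincidence rate is modeled as R ∝ 1 + V cos Δφ. All parameter values are illustrative.

```python
# Toy model: two-photon beating when one biphoton component propagates with
# refractive index n != 1 while the other sees n = 1. The relative phase
# Delta_phi = (n - 1) * omega * L / c plays the role of the Franson
# long-long / short-short path difference; coincidences follow
# R ~ 1 + V * cos(Delta_phi). All numbers below are illustrative.
import numpy as np

c = 2.9979e8                  # speed of light, m/s
omega = 2 * np.pi * 3.8e14    # optical angular frequency, rad/s (assumed)
L = np.linspace(0, 2e-4, 9)   # medium length, m (assumed range)
n = 1.002                     # tunable sideband refractive index (assumed)
V = 0.8                       # visibility set by pump power etc. (assumed)

dphi = (n - 1.0) * omega * L / c
R = 1.0 + V * np.cos(dphi)    # normalized coincidence rate
for Li, Ri in zip(L, R):
    print(f"L = {Li * 1e6:7.1f} um  ->  R = {Ri:.3f}")
```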

    Numerical simulation study of the dynamical behavior of the Niedermayer algorithm

    We calculate the dynamic critical exponent for the Niedermayer algorithm applied to the two-dimensional Ising and XY models, for various values of the free parameter E_0. For E_0 = −1 we regain the Metropolis algorithm, and for E_0 = 1 we regain the Wolff algorithm. For −1 < E_0 < 1, we show that the mean size of the clusters of (possibly) flipped spins initially grows with the linear size of the lattice, L, but eventually saturates at a given lattice size L~, which depends on E_0. For L > L~, the Niedermayer algorithm is equivalent to the Metropolis one, i.e., they have the same dynamic exponent. For E_0 > 1, the autocorrelation time is always greater than for E_0 = 1 (Wolff) and, more importantly, it grows faster than a power of L. Therefore, the Wolff algorithm is the best choice of cluster algorithm within the Niedermayer generalization. We also study the dynamic behavior of the Wolff algorithm: although not conclusive, we propose a scaling law for the dependence of the autocorrelation time on L. Comment: Accepted for publication in Journal of Statistical Mechanics: Theory and Experiment
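
    For reference, here is a minimal sketch of one Niedermayer cluster update for the 2D Ising model as we understand the rule: parallel bonds are activated with probability 1 − e^{−β(1+E_0)}, and the cluster flip is accepted with probability min[1, e^{−β(1−E_0)(n_par − n_anti)}] over the boundary bonds, which reduces to single-spin Metropolis at E_0 = −1 and to Wolff (always accept) at E_0 = +1. All identifiers are ours, and the form shown is only valid for −1 ≤ E_0 ≤ 1.

```python
# Minimal sketch of one Niedermayer cluster update for the 2D Ising model
# (J = 1, periodic boundaries), valid for -1 <= E0 <= 1. Parallel bonds are
# activated with p = 1 - exp(-beta*(1 + E0)); the cluster flip is accepted
# with min(1, exp(-beta*(1 - E0)*(n_par - n_anti))), where n_par (n_anti)
# counts boundary bonds to parallel (antiparallel) neighbors.
# E0 = -1 recovers single-spin Metropolis; E0 = +1 recovers Wolff.
import numpy as np

rng = np.random.default_rng(1)

def neighbors(i, j, L):
    return (((i + 1) % L, j), ((i - 1) % L, j),
            (i, (j + 1) % L), (i, (j - 1) % L))

def niedermayer_step(spins, beta, E0):
    L = spins.shape[0]
    seed = (int(rng.integers(L)), int(rng.integers(L)))
    s0 = spins[seed]
    p_add = max(0.0, 1.0 - np.exp(-beta * (1.0 + E0)))
    cluster, stack = {seed}, [seed]
    while stack:                                  # grow the cluster
        site = stack.pop()
        for nb in neighbors(*site, L):
            if nb not in cluster and spins[nb] == s0 and rng.random() < p_add:
                cluster.add(nb)
                stack.append(nb)
    n_par = n_anti = 0                            # classify boundary bonds
    for site in cluster:
        for nb in neighbors(*site, L):
            if nb not in cluster:
                if spins[nb] == s0:
                    n_par += 1
                else:
                    n_anti += 1
    if rng.random() < min(1.0, np.exp(-beta * (1.0 - E0) * (n_par - n_anti))):
        for site in cluster:
            spins[site] *= -1                     # flip the whole cluster
    return len(cluster)

# quick demo at the exact critical temperature of the 2D Ising model
L, beta = 16, np.log(1 + np.sqrt(2)) / 2
spins = rng.choice([-1, 1], size=(L, L))
for E0 in (-1.0, 0.0, 1.0):
    sizes = [niedermayer_step(spins, beta, E0) for _ in range(2000)]
    print(f"E0 = {E0:+.1f}: mean cluster size = {np.mean(sizes):.1f}")
```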

    Case-control study of stroke and the quality of hypertension control in north west England

    Objective: To examine the risk of stroke in relation to the quality of hypertension control in routine general practice across an entire health district. Design: Population based matched case-control study. Setting: East Lancashire Health District, with a participating population of 388,821 aged ≤ 80. Subjects: Cases were patients under 80 with their first stroke, identified from a population based stroke register between 1 July 1994 and 30 June 1995. For each case, two controls matched with the case for age and sex were selected from the same practice register. Hypertension was defined as systolic blood pressure ≥ 160 mm Hg or diastolic blood pressure ≥ 95 mm Hg, or both, on at least two occasions within any three month period, or any history of treatment with antihypertensive drugs. Main outcome measures: Prevalence of hypertension, quality of control of hypertension (assessed by using the mean blood pressure recorded before stroke), and odds ratios of stroke (derived from conditional logistic regression). Results: Records of 267 cases and 534 controls were examined; 61% and 42% of these subjects respectively were hypertensive. Compared with non-hypertensive subjects, hypertensive patients receiving treatment whose average pre-event systolic blood pressure was controlled, those with intermediate control, those uncontrolled (≥ 160 mm Hg), and those untreated had progressively raised odds ratios of 1.6, 2.2, 3.2, and 3.5 respectively. Results for diastolic pressure were similar; both were independent of initial pressures before treatment. Around 21% of strokes were thus attributable to inadequate control with treatment, or 46 first events yearly per 100,000 population aged 40-79. Conclusions: The risk of stroke was clearly related to the quality of blood pressure control with treatment. In routine practice, consistent control of blood pressure to below 150/90 mm Hg seems to be required for optimal stroke prevention.
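
    The attributable-risk arithmetic can be sketched with the standard case-control formula (Miettinen's AF = p_c(OR − 1)/OR, summed over exposure categories). The odds ratios below are the abstract's, but the case-exposure proportions are placeholders we invented, so this does not reproduce the study's 21% figure exactly.

```python
# Sketch of standard case-control attributable-fraction arithmetic
# (Miettinen: AF = p_c * (OR - 1) / OR, where p_c is the proportion of
# cases in the exposure category). Odds ratios are from the abstract;
# the case proportions below are placeholders, NOT the study's counts.
def attributable_fraction(p_cases_exposed, odds_ratio):
    return p_cases_exposed * (odds_ratio - 1.0) / odds_ratio

# hypothetical proportions of the 267 cases in each inadequately
# controlled/untreated category, paired with the published odds ratios
categories = [(0.10, 1.6), (0.10, 2.2), (0.10, 3.2), (0.10, 3.5)]
paf = sum(attributable_fraction(p, o) for p, o in categories)
print(f"combined attributable fraction ~ {paf:.1%}")
```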